The mainstay of malaria diagnosis has been the microscopic examination of blood, utilizing blood films.[1] Although blood is the sample most frequently used to make a diagnosis, both saliva and urine have been investigated as alternative, less invasive specimens.[2] More recently, modern techniques utilizing antigen detection or the polymerase chain reaction have been developed, though these are not widely implemented in malaria-endemic regions.[3][4] Areas that cannot afford laboratory diagnostic tests often use only a history of subjective fever as the indication to treat for malaria.
Species | Periodicity | Persistent liver stage
---|---|---
Plasmodium vivax | tertian | yes
Plasmodium ovale | tertian | yes
Plasmodium falciparum | tertian | no
Plasmodium malariae | quartan | no
The most economical, preferred, and reliable diagnosis of malaria is microscopic examination of blood films, because each of the four major parasite species has distinguishing characteristics. Two sorts of blood film are traditionally used. Thin films are similar to usual blood films and allow species identification, because the parasite's appearance is best preserved in this preparation. Thick films allow the microscopist to screen a larger volume of blood and are about eleven times more sensitive than the thin film, so picking up low levels of infection is easier on the thick film; however, the appearance of the parasite is much more distorted, which makes distinguishing between the different species much more difficult. Given the complementary strengths and weaknesses of the two preparations, both smears should be examined when attempting a definitive diagnosis.[5]
From the thick film, an experienced microscopist can detect parasite densities (parasitemia) as low as 5 parasites/µL of blood.[6] Species diagnosis can be difficult because the early trophozoites ("ring forms") of all four species look identical; it is never possible to identify a species on the basis of a single ring form, and species identification is always based on several trophozoites.
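The parasite density quoted above is conventionally estimated from a thick film by tallying parasites against white blood cells and scaling by an assumed WBC density (8000 WBC/µL is the usual assumption when the patient's true count is unknown). A minimal sketch of that arithmetic, with illustrative function and parameter names:

```python
# Sketch of the conventional thick-film density estimate: parasites are
# counted against white blood cells, and an assumed WBC density converts
# the ratio into parasites per microlitre of blood.

def parasites_per_ul(parasites_counted: int,
                     wbc_counted: int = 200,
                     assumed_wbc_per_ul: int = 8000) -> float:
    """Estimate parasite density (parasites/µL) from a thick blood film."""
    if wbc_counted <= 0:
        raise ValueError("must count at least one WBC")
    return parasites_counted * assumed_wbc_per_ul / wbc_counted

# Example: 25 parasites seen while counting 200 WBCs
print(parasites_per_ul(25))  # 1000.0 parasites/µL
```

Counting against a fixed number of WBCs (often 200 or 500) rather than a fixed volume is what makes the method practical at the microscope; the trade-off is that the final figure inherits the error of the assumed WBC density.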
Note that P. malariae and P. knowlesi (the most common cause of malaria in Southeast Asia) look very similar under the microscope. However, P. knowlesi parasitemia increases very quickly and causes more severe disease than P. malariae, so it is important to identify and treat these infections rapidly. Therefore, modern methods that can distinguish between the two species, such as PCR (see "Molecular methods" below) or monoclonal antibody panels, should be used in this part of the world.[7]
For areas where microscopy is not available, or where laboratory staff are not experienced at malaria diagnosis, there are commercial antigen detection tests that require only a drop of blood.[8] Immunochromatographic tests (also called malaria rapid diagnostic tests, antigen-capture assays, or "dipsticks") have been developed, distributed, and field-tested. These tests use finger-stick or venous blood; a completed test takes 15–20 minutes in total, and the results are read visually as the presence or absence of colored stripes on the dipstick, making them suitable for use in the field. The detection threshold of these rapid diagnostic tests is in the range of 100 parasites/µL of blood (commercial kits range from about 0.002% to 0.1% parasitemia), compared with 5 parasites/µL by thick-film microscopy. One disadvantage is that dipstick tests are qualitative rather than quantitative: they can determine whether parasites are present in the blood, but not how many.
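The two units quoted above (parasites/µL and percent parasitemia) are related by the red-cell density of blood. A rough conversion sketch, assuming a typical density of 5 × 10⁶ RBC/µL (an assumed round number; real counts vary by patient):

```python
# Rough conversion between parasite density and percent parasitemia,
# assuming a typical red blood cell density (an assumption, not a
# measured value; individual RBC counts vary considerably).

RBC_PER_UL = 5_000_000  # assumed typical red blood cells per µL

def percent_parasitemia(parasites_per_ul: float) -> float:
    """Convert parasite density (parasites/µL) to percent parasitemia."""
    return parasites_per_ul * 100 / RBC_PER_UL

print(percent_parasitemia(100))  # 0.002  -> the ~100 parasites/µL RDT threshold
print(percent_parasitemia(5))    # 0.0001 -> the thick-film microscopy threshold
```

Under this assumption, the ~100 parasites/µL RDT threshold corresponds to the 0.002% lower bound quoted for commercial kits, which is why the two figures appear together in the text.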
The first rapid diagnostic tests used P. falciparum glutamate dehydrogenase (pGluDH) as the antigen.[3] pGluDH was soon replaced by P. falciparum lactate dehydrogenase (pLDH), a 33 kDa oxidoreductase [EC 1.1.1.27]. It is the last enzyme of the glycolytic pathway, essential for ATP generation, and one of the most abundant enzymes expressed by P. falciparum. pLDH does not persist in the blood but clears at about the same time as the parasites following successful treatment. This lack of antigen persistence after treatment makes the pLDH test useful in predicting treatment failure; in this respect, pLDH is similar to pGluDH. Depending on which monoclonal antibodies are used, this type of assay can distinguish between all five species of human malaria parasites, because of antigenic differences between their pLDH isoenzymes.
Molecular methods are available in some clinical laboratories, and rapid real-time assays (for example, QT-NASBA, based on nucleic acid sequence-based amplification)[4] are being developed in the hope of deploying them in endemic areas.
PCR (and other molecular methods) is more accurate than microscopy. However, it is expensive and requires a specialized laboratory. Moreover, levels of parasitemia do not necessarily correlate with the progression of disease, particularly when the parasite is able to adhere to blood-vessel walls. Therefore, more sensitive, low-tech diagnostic tools need to be developed to detect low levels of parasitemia in the field.[9]
Using Giemsa-stained blood smears from children in Malawi, one study showed that when clinical predictors (rectal temperature, nailbed pallor, and splenomegaly) were used as treatment indications, rather than a history of subjective fever alone, correct diagnoses increased from 2% to 41% of cases, and unnecessary treatment for malaria decreased significantly.[9]
Fever and septic shock are commonly misdiagnosed as severe malaria in Africa, leading to a failure to treat other life-threatening illnesses. In malaria-endemic areas, parasitemia does not ensure a diagnosis of severe malaria, because parasitemia can be incidental to other concurrent disease. Recent investigations suggest that malarial retinopathy is better (collective sensitivity of 95% and specificity of 90%) than any other clinical or laboratory feature in distinguishing malarial from non-malarial coma.[10]